Not so naive Bayesian classification
Authors
Abstract
Of numerous proposals to improve the accuracy of naive Bayes by weakening its attribute independence assumption, both LBR and TAN have demonstrated remarkable error performance. However, both techniques obtain this outcome at a considerable computational cost. We present a new approach to weakening the attribute independence assumption by averaging all of a constrained class of classifiers. In extensive experiments this technique delivers comparable prediction accuracy to LBR and TAN with substantially improved computational efficiency.
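The abstract does not spell out the constrained class of classifiers in this excerpt, but one natural way to instantiate "averaging all of a constrained class of classifiers" is to average one-dependence estimators: each ensemble member designates one attribute as a super-parent and conditions every other attribute on the class and that parent, and the ensemble averages the members' joint-probability estimates. The sketch below illustrates that averaging scheme for small categorical data sets; the class name, the shared value range, and the Laplace smoothing are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class AveragedOneDependenceSketch:
    """Illustrative ensemble that averages one-dependence estimators:
    member i treats attribute i as a "super-parent" and conditions every
    other attribute on the class and that parent."""

    def fit(self, X, y):
        X = np.asarray(X, dtype=int)
        y = np.asarray(y, dtype=int)
        self.n, self.d = X.shape
        self.n_classes = int(y.max()) + 1
        self.n_vals = int(X.max()) + 1   # assumes attributes share one categorical value range
        # pair[c, i, vi, j, vj] = number of training examples with
        # class c, attribute i = vi and attribute j = vj
        self.pair = np.zeros((self.n_classes, self.d, self.n_vals, self.d, self.n_vals))
        for row, c in zip(X, y):
            for i in range(self.d):
                for j in range(self.d):
                    self.pair[c, i, row[i], j, row[j]] += 1
        return self

    def predict_proba(self, row):
        row = np.asarray(row, dtype=int)
        scores = np.zeros(self.n_classes)
        for c in range(self.n_classes):
            for i in range(self.d):                       # super-parent attribute of this member
                parent_count = self.pair[c, i, row[i], i, row[i]]
                # P(c, x_i) with Laplace smoothing
                p = (parent_count + 1.0) / (self.n + self.n_classes * self.n_vals)
                for j in range(self.d):                   # children conditioned on (class, parent)
                    if j != i:
                        p *= (self.pair[c, i, row[i], j, row[j]] + 1.0) / (
                            parent_count + self.n_vals)
                scores[c] += p                            # averaging constant cancels on normalisation
        return scores / scores.sum()
```

In this sketch training is a single counting pass with no structure search or model selection, which suggests how an averaged ensemble of this kind can be substantially cheaper to build than LBR's lazy rule construction or TAN's structure learning, consistent with the efficiency claim in the abstract.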
Similar resources
A Validation Test Naive Bayesian Classification Algorithm and Probit Regression as Prediction Models for Managerial Overconfidence in Iran's Capital Market
Corporate directors influenced by overconfidence, one of the personality traits of individuals, may make irrational decisions that have a significant impact on the company's performance in the long run. The purpose of this paper is to validate and compare the naive Bayesian classification algorithm and probit regression for predicting managerial overconfidence at pre...
A Bayesian mixture model for classification of certain and uncertain data
There are various classification methods for certain data. However, the values of variables are not always certain; they may instead lie within an interval, in which case the data are called uncertain. In recent years, under the assumption that uncertain data follow a normal distribution, several estimators for the mean and variance of this distribution have been proposed. In this paper, we co...
A Study of AdaBoost with Naive Bayesian Classifiers: Weakness and Improvement
This article investigates boosting naive Bayesian classification. It first shows that boosting does not improve the accuracy of the naive Bayesian classifier as much as we expected in a set of natural domains. By analyzing the reason for boosting’s weakness, we propose to introduce tree structures into naive Bayesian classification to improve the performance of boosting when working with naive ...
Boosting and Naive Bayesian Learning
Although so-called “naive” Bayesian classification makes the unrealistic assumption that the values of the attributes of an example are independent given the class of the example, this learning method is remarkably successful in practice, and no uniformly better learning method is known. Boosting is a general method of combining multiple classifiers due to Yoav Freund and Rob Schapire. This pap...
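For reference, the "naive" assumption mentioned above amounts to scoring each class c as P(c) multiplied by the product over attributes of P(x_j | c), treating the attributes as independent given the class. A minimal sketch of that scoring rule follows; the function name, the Laplace smoothing and the categorical encoding are illustrative assumptions, not code from the paper.

```python
import numpy as np

def naive_bayes_posterior(X, y, query, n_vals):
    """Score classes for `query` under the naive assumption
    P(c | x) proportional to P(c) * prod_j P(x_j | c)."""
    X = np.asarray(X, dtype=int)
    y = np.asarray(y, dtype=int)
    classes = np.unique(y)
    scores = np.empty(len(classes))
    for k, c in enumerate(classes):
        Xc = X[y == c]
        score = (len(Xc) + 1.0) / (len(X) + len(classes))   # smoothed class prior P(c)
        for j, v in enumerate(query):
            # P(x_j = v | c), estimated independently of the other attributes
            score *= ((Xc[:, j] == v).sum() + 1.0) / (len(Xc) + n_vals)
        scores[k] = score
    return classes, scores / scores.sum()
```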
The study on the spam filtering technology based on Bayesian algorithm
This paper analyzes spam filtering technology, carries out a detailed study of the naive Bayes algorithm, and proposes an improved naive Bayesian mail filtering technique. The improvements concern both text selection and feature extraction. General Bayesian text classification algorithms mostly rely on information gain and cross-entropy for feature selection. Through the principle o...
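As a rough illustration of what information-gain feature selection means in this spam-filtering setting, each candidate term can be scored by how much knowing its presence reduces uncertainty about the spam/ham label, keeping the highest-scoring terms as features. The sketch below is an assumption-laden illustration, not the paper's code; the function and variable names are invented for the example.

```python
import numpy as np

def information_gain(term_present, is_spam):
    """Information gain of one binary term feature for a spam/ham label:
    H(label) minus the expected entropy of the label after splitting the
    documents on whether the term is present."""
    t = np.asarray(term_present, dtype=bool)
    y = np.asarray(is_spam, dtype=bool)

    def entropy(labels):
        p = np.bincount(labels.astype(int), minlength=2) / len(labels)
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    gain = entropy(y)
    for mask in (t, ~t):          # documents with and without the term
        if mask.any():
            gain -= mask.mean() * entropy(y[mask])
    return gain
```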
Bayesian Network Classifiers. An Application to Remote Sensing Image Classification
Different probabilistic models for classification and prediction problems are analyzed in this article, studying their behaviour and capability in data classification. To show the capability of Bayesian networks to deal with classification problems, four types of Bayesian networks are introduced: a General Bayesian Network, the Naive Bayes, a Bayesian Network Augmented Naive Bayes and the Tree Aug...
Journal:
Volume, Issue:
Pages: -
Publication date: 2003